Course Name | |
Code | Semester | Theory (hour/week) | Application/Lab (hour/week) | Local Credits | ECTS |
---|---|---|---|---|---|
| Fall/Spring | | | | |
Prerequisites | None | |||||
Course Language | ||||||
Course Type | Elective | |||||
Course Level | - | |||||
Mode of Delivery | - | |||||
Teaching Methods and Techniques of the Course | Problem Solving, Case Study | |||||
Course Coordinator | ||||||
Course Lecturer(s) | ||||||
Assistant(s) | - |
Course Objectives | |
Learning Outcomes | The students who succeed in this course will be able to:
|
Course Description |
| Core Courses | |
Major Area Courses | ||
Supportive Courses | X | |
Media and Management Skills Courses | ||
Transferable Skill Courses |
Week | Subjects | Required Materials |
---|---|---|
1 | Biological motivation. Historical remarks on artificial neural networks. Applications of artificial neural networks. A taxonomy of artificial neural network models and learning algorithms. | Introduction. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN-13 978-0131293762, ISBN-10 0131293761. Lecture Notes. |
2 | General artificial neuron model. Discrete-valued perceptron model, threshold logic and their limitations. Discrete-time (dynamical) Hopfield networks. Hebb’s rule. Connection weight matrix as an outer product of memory patterns. | Chapter 1. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN-13 978-0131293762, ISBN-10 0131293761. Lecture Notes. |
3 | Supervised learning. Perceptron learning algorithm. Adaptive linear element. Supervised learning as an output error minimization problem. Gradient descent algorithm for minimization. Least mean square rule. | Chapter 2. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN-13 978-0131293762, ISBN-10 0131293761. Lecture Notes. |
4 | Single-layer, continuous-valued perceptron. Nonlinear (sigmoidal) activation function. Delta rule. Batch-mode and pattern-mode gradient descent algorithms. Convergence conditions for deterministic and stochastic gradient descent algorithms. | Chapter 3; Chapter 4: Sections 4.1, 4.2, 4.16. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN-13 978-0131293762, ISBN-10 0131293761. Lecture Notes. |
5 | Multilayer perceptron as universal approximator. Function representation and approximation problems. Backpropagation learning. Local minima problem. Overtraining. | Chapter 4: Sections 4.4, 4.5, 4.8, 4.10, 4.12. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN-13 978-0131293762, ISBN-10 0131293761. Lecture Notes. |
6 | Midterm Exam I. Batch- and pattern-mode training. Training set versus test set. Overfitting problem. General practices for network training and testing. Signal processing and pattern recognition applications of multilayer perceptrons. | Chapter 4: Sections 4.3, 4.10, 4.11, 4.13, 4.14, 4.15, 4.19, 4.20. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN-13 978-0131293762, ISBN-10 0131293761. Lecture Notes. |
7 | Radial Basis Function (RBF) network. Backpropagation learning for determining the linear weight, center and width parameters of RBF networks. Random selection of centers. Input versus input-output clustering for center and width determination. Regularization theory, mixture-of-Gaussians (conditional probability density function) model and neuro-fuzzy connections of RBF networks. | Chapter 5. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN-13 978-0131293762, ISBN-10 0131293761. Lecture Notes. |
8 | Support vector machines for classification. Kernel representations. Generalization ability. Vapnik-Chervonenkis dimension. Support vector regression. Comparison of different kernels, loss (error) functions and norms for (separating hyperplane) flatness. | Chapter 6. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN-13 978-0131293762, ISBN-10 0131293761. Lecture Notes. |
9 | Parametric versus nonparametric methods for data representation. Unsupervised learning as a vector quantization problem. Competitive networks. Winner-take-all network. Kohonen’s self-organizing feature map. Clustering. | Chapter 9. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN-13 978-0131293762, ISBN-10 0131293761. Lecture Notes. |
10 | Continuous-time Hopfield networks. Stability analysis of multiple equilibria of Hopfield networks. Hopfield networks for cost minimization: Lyapunov (energy) based design of Hopfield networks. Associative memory. Traveling salesman problem. Combinatorial optimization. | Chapter 13: Sections 13.1-13.7. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN-13 978-0131293762, ISBN-10 0131293761. Lecture Notes. |
11 | Midterm Exam II. Signal processing applications of artificial neural networks. Principal component analysis. Data compression and reduction. Image and 1-D signal compression and transformation applications of artificial neural networks. | Chapter 8. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN-13 978-0131293762, ISBN-10 0131293761. |
12 | Pattern recognition applications of artificial neural networks. Artificial neural networks for feature extraction. Nonlinear feature mapping. Data fusion. Artificial neural networks as classifiers. Image and speech recognition applications. | Sections 1.4, 1.5, 3.11, 4.7, 5.8, 6.7. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN-13 978-0131293762, ISBN-10 0131293761. Lecture Notes. |
13 | Control applications of artificial neural networks. Artificial neural networks for system identification. Artificial neural networks as controllers. Inverse systems design. Direct and indirect control methods. Adaptive control applications. | Chapter 15: Section 15.3. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN-13 978-0131293762, ISBN-10 0131293761. Lecture Notes. |
14 | Implementation of artificial neural network models and associated learning algorithms for signal processing, pattern recognition and control in the MATLAB numerical software environment. | Lecture Notes. |
15 | Cumulative review of artificial neural network models, learning algorithms and their applications. | S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN-13 978-0131293762, ISBN-10 0131293761. Lecture Notes. |
16 | Review of the Semester |
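Several of the weeks above (notably Weeks 2-4) revolve around the perceptron model and its error-driven weight update. As an illustrative aid only, not part of the official course materials, the following minimal Python/NumPy sketch of the perceptron learning algorithm trains on a toy logical-AND dataset (the dataset, function names and parameter values are assumptions chosen for the example):

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=100):
    """Rosenblatt perceptron learning rule; bias handled via an augmented input."""
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])  # append a constant bias column
    w = np.zeros(Xa.shape[1])                      # weights (incl. bias) start at zero
    for _ in range(epochs):
        errors = 0
        for xi, target in zip(Xa, y):
            pred = 1 if xi @ w >= 0 else 0         # hard threshold activation
            if pred != target:
                w += lr * (target - pred) * xi     # update weights only on mistakes
                errors += 1
        if errors == 0:                            # converged: whole epoch error-free
            break
    return w

def predict(w, X):
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])
    return (Xa @ w >= 0).astype(int)

# Toy dataset: logical AND, which is linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w = train_perceptron(X, y)
print(predict(w, X))  # → [0 0 0 1]
```

On linearly separable data such as AND, the perceptron convergence theorem guarantees the training loop terminates; replacing `y` with XOR labels illustrates the threshold-logic limitation noted in Week 2, since no single weight vector can classify all four XOR patterns.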
Course Notes/Textbooks | S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761 |
Suggested Readings/Materials | J. M. Zurada, Introduction to Artificial Neural Systems, West Publishing Company, 1992, ISBN-10 053495460X, ISBN-13 978-0534954604. |
Semester Activities | Number | Weighting |
---|---|---|
Participation | ||
Laboratory / Application | ||
Field Work | ||
Quizzes / Studio Critiques | ||
Portfolio | ||
Homework / Assignments | 5 | 20 |
Presentation / Jury | ||
Project | 1 | 30 |
Seminar / Workshop | ||
Oral Exam | ||
Midterm | 2 | 50 |
Final Exam | ||
Total | 8 | 100 |
Weighting of Semester Activities on the Final Grade | 100 | |
Weighting of End-of-Semester Activities on the Final Grade | ||
Total | 100 | |
Semester Activities | Number | Duration (Hours) | Workload |
---|---|---|---|
Course Hours (Including exam week: 16 x total hours) | 16 | 3 | 48 |
Laboratory / Application Hours (Including exam week: 16 x total hours) | 16 | ||
Study Hours Out of Class | 15 | 1 | 15 |
Field Work | |||
Quizzes / Studio Critiques | |||
Portfolio | |||
Homework / Assignments | 5 | 3 | 15 |
Presentation / Jury | |||
Project | 1 | 24 | 24 |
Seminar / Workshop | |||
Oral Exam | |||
Midterms | 2 | 9 | 18 |
Final Exams | |||
Total | 120 |
# | Program Competencies/Outcomes | * Contribution Level | ||||
1 | 2 | 3 | 4 | 5 | ||
1 | Be able to define problems in real life by identifying functional and nonfunctional requirements that the software is to execute | |||||
2 | Be able to design and analyze software at component, subsystem, and software architecture level | |||||
3 | Be able to develop software by coding, verifying, doing unit testing and debugging | |||||
4 | Be able to verify software by testing its behaviour, execution conditions, and expected results | |||||
5 | Be able to maintain software due to working environment changes, new user demands and the emergence of software errors that occur during operation | |||||
6 | Be able to monitor and control changes in the software, the integration of software with other software systems, and plan to release software versions systematically | |||||
7 | To have knowledge in the area of software requirements understanding, process planning, output specification, resource planning, risk management and quality planning | |||||
8 | Be able to identify, evaluate, measure and manage changes in software development by applying software engineering processes | |||||
9 | Be able to use various tools and methods to do the software requirements, design, development, testing and maintenance | X | ||||
10 | To have knowledge of basic quality metrics, software life cycle processes, software quality, quality model characteristics, and be able to use them to develop, verify and test software | |||||
11 | To have knowledge in other disciplines that have common boundaries with software engineering such as computer engineering, management, mathematics, project management, quality management, software ergonomics and systems engineering | X | ||||
12 | Be able to grasp software engineering culture and concept of ethics, and have the basic information of applying them in the software engineering | X | ||||
13 | Be able to use a foreign language to follow related field publications and communicate with colleagues | X |
*1 Lowest, 2 Low, 3 Average, 4 High, 5 Highest